Entry Selection
It is well known in the IO literature that incumbent firms may want to deter entry by behaving as if they were efficient. In this paper we show that incumbents may sometimes prefer to encourage entry by mimicking the behaviour of a less efficient firm, for the following reason. If the incumbent cannot deter efficient potential entrants, it may want to elicit entry by an inefficient firm that would not enter if it knew the incumbent were efficient. The presence of the additional firm in the market prevents further entry, so the incumbent faces a less efficient competitor in the long run.
Keywords: duopoly competition, entry deterrence, signalling weakness
Fixed costs matter even when the costs are sunk
How firms set prices is key to understanding markets. Standard economics dictates that a firm's fixed costs should not affect its prices. Nonetheless, it is common practice for firms to raise their prices after an increase in fixed costs. We show that firms are correct to do so if two ubiquitous conditions apply: (i) future profits increase in current sales, and (ii) firms are liquidity-constrained.
Achievable performance of blind policies in heavy traffic
For a GI/GI/1 queue, we show that the average sojourn time under the (blind) Randomized Multilevel Feedback algorithm is no worse than that under the Shortest Remaining Processing Time algorithm times a logarithmic function of the system load. Moreover, it is verified that this bound is tight in heavy traffic, up to a constant multiplicative factor. We obtain this result by combining techniques from two disparate areas: competitive analysis and applied probability
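Schematically, the sojourn-time bound can be written as follows, with rho the system load (the constant C and the exact form of the logarithmic factor are illustrative here, not quoted from the paper):

```latex
\mathbb{E}\!\left[T^{\mathrm{RMLF}}\right]
\;\le\; C \,\log\!\left(\frac{1}{1-\rho}\right)\,
\mathbb{E}\!\left[T^{\mathrm{SRPT}}\right],
```

with tightness in heavy traffic (rho approaching 1) up to the constant C.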
Uniform asymptotics for compound Poisson processes with regularly varying jumps and vanishing drift
This paper addresses heavy-tailed large-deviation estimates for the distribution tail of functionals of a class of spectrally one-sided Lévy processes. Our contribution is to show that these estimates remain valid in a near-critical regime. This complements recent similar results that have been obtained for the all-time supremum of such processes. Specifically, we consider local asymptotics of the all-time supremum, the supremum of the process until exiting [0,∞), the maximum jump until that time, and the time it takes until exiting [0,∞). The proofs rely, among other things, on properties of scale functions.
Heavy-traffic analysis of sojourn time under the foreground–background scheduling policy
We consider the steady-state distribution of the sojourn time of a job entering an M/GI/1 queue with the foreground–background scheduling policy in heavy traffic. The growth rate of its mean as well as the limiting distribution are derived under broad conditions. Assumptions commonly used in extreme value theory play a key role in both the analysis and the results.
Glauber dynamics in a single-chain magnet: From theory to real systems
The Glauber dynamics is studied in a single-chain magnet. As predicted, a single relaxation mode of the magnetization is found. Above 2.7 K, the thermally activated relaxation time is mainly governed by the effect of magnetic correlations and the energy barrier experienced by each magnetic unit. This result is in perfect agreement with independent thermodynamic measurements. Below 2.7 K, a crossover towards a relaxation regime is observed that is interpreted as the manifestation of finite-size effects. The temperature dependences of the relaxation time and of the magnetic susceptibility reveal the importance of the boundary conditions.
Comment: Submitted to PRL 10 May 2003. Submitted to PRB 12 December 2003; published 15 April 200
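The thermally activated regime above 2.7 K is conventionally summarised by an Arrhenius law; the barrier decomposition below is the standard single-chain-magnet form and its symbols are illustrative, not taken from the abstract:

```latex
\tau(T) \;=\; \tau_0 \,\exp\!\left(\frac{\Delta_\tau}{k_B T}\right),
\qquad
\Delta_\tau \;=\; \Delta_\xi + \Delta_A ,
```

where Delta_xi is the energy cost of a domain wall set by the magnetic correlations and Delta_A the anisotropy barrier experienced by each magnetic unit; the crossover below 2.7 K is attributed to finite-size effects that reduce the effective barrier.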
Sandpiles with height restrictions
We study stochastic sandpile models with a height restriction in one and two dimensions. A site can topple if it has a height of two, as in Manna's model, but, in contrast to previously studied sandpiles, here the height (or number of particles per site) cannot exceed two. This yields a considerable simplification over the unrestricted case, in which the number of states per site is unbounded. Two toppling rules are considered: in one, the particles are redistributed independently, while the other involves some cooperativity. We study the fixed-energy system (no input or loss of particles) using cluster approximations and extensive simulations, and find that it exhibits a continuous phase transition to an absorbing state at a critical value zeta_c of the particle density. The critical exponents agree with those of the unrestricted Manna sandpile.
Comment: 10 pages, 14 figures
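The restricted toppling rule with independent redistribution can be sketched directly from the description above. This is an illustrative one-dimensional fixed-energy simulation; periodic boundaries and the convention that a particle whose move would overfill a site stays put are my assumptions, not the paper's:

```python
import random

def step(h, rng):
    """Pick a random active site (height == 2) and topple it: each of
    its two particles independently tries to move to a random nearest
    neighbour; a move onto a full site is rejected and the particle
    stays put (one possible way to enforce the height cap).
    Returns False once no active site remains (absorbing state)."""
    active = [i for i, v in enumerate(h) if v == 2]
    if not active:
        return False
    i = rng.choice(active)
    L = len(h)
    for _ in range(2):
        j = (i + rng.choice((-1, 1))) % L   # periodic boundaries
        if h[j] < 2:                        # height restriction
            h[i] -= 1
            h[j] += 1
    return True

def simulate(L=50, density=0.6, steps=10_000, seed=0):
    """Fixed-energy dynamics: drop int(density * L) particles at
    random sites (respecting the cap), then topple until the
    absorbing state is reached or the step budget runs out."""
    rng = random.Random(seed)
    h = [0] * L
    for _ in range(int(density * L)):
        while True:
            j = rng.randrange(L)
            if h[j] < 2:
                h[j] += 1
                break
    for _ in range(steps):
        if not step(h, rng):
            break
    return h
```

Because the dynamics are fixed-energy, the particle count is conserved throughout; below the critical density the system falls into an absorbing configuration with no site at height two.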
Accurate training of the Cox proportional hazards model on vertically-partitioned data while preserving privacy
BACKGROUND: Analysing distributed medical data is challenging because of data sensitivity and the various regulations on accessing and combining data. Some privacy-preserving methods are known for analysing horizontally-partitioned data, where different organisations hold similar data on disjoint sets of people. Technically more challenging is the case of vertically-partitioned data, which deals with data on overlapping sets of people. We use an emerging technology based on cryptographic techniques called secure multi-party computation (MPC), and apply it to perform privacy-preserving survival analysis on vertically-distributed data by means of the Cox proportional hazards (CPH) model. Both MPC and CPH are explained. METHODS: We use a Newton-Raphson solver to securely train the CPH model with MPC, jointly with all data holders, without revealing any sensitive data. Securely computing the log-partial likelihood in each iteration raises several technical challenges for the efficiency and security of our solution. To tackle them, we generalise a cryptographic protocol for securely computing the inverse of the Hessian matrix and develop a new method for securely computing exponentiations. A theoretical complexity estimate is given to provide insight into the computational and communication effort that is needed. RESULTS: Our secure solution is implemented in a setting with three different machines, each representing a different data holder, which can communicate through the internet. The MPyC platform is used for implementing this privacy-preserving solution to obtain the CPH model. We test the accuracy and computation time of our methods on three standard benchmark survival datasets. We identify future work to make our solution more efficient. CONCLUSIONS: Our secure solution is comparable with the standard, non-secure solver in terms of accuracy and convergence speed. The computation time is considerably larger, although the theoretical complexity is still cubic in the number of covariates and quadratic in the number of subjects. We conclude that this is a promising way of performing parametric survival analysis on vertically-distributed medical data, while realising a high level of security and privacy.
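The plain, non-secure core of such a solver is ordinary Newton-Raphson on the Cox partial likelihood. A minimal one-covariate sketch, assuming no tied event times (the function name, interface, and toy data are illustrative; the paper's contribution is performing these same iterations on secret-shared data with MPC):

```python
import math

def cox_newton(times, events, x, iters=25):
    """Newton-Raphson for a one-covariate Cox model without tied
    event times: maximise the partial likelihood in the scalar beta.
    Ordinary, non-secure analogue of the solver described above."""
    # Sort by event time so that every risk set is a suffix.
    order = sorted(range(len(times)), key=lambda i: times[i])
    t = [times[i] for i in order]
    d = [events[i] for i in order]   # 1 = event, 0 = censored
    z = [x[i] for i in order]
    beta = 0.0
    for _ in range(iters):
        grad, hess = 0.0, 0.0
        for i in range(len(t)):
            if not d[i]:
                continue
            risk = range(i, len(t))                    # at risk at t[i]
            w = [math.exp(beta * z[j]) for j in risk]
            s0 = sum(w)
            s1 = sum(wj * z[j] for wj, j in zip(w, risk))
            s2 = sum(wj * z[j] ** 2 for wj, j in zip(w, risk))
            mean = s1 / s0
            grad += z[i] - mean          # score contribution
            hess += s2 / s0 - mean ** 2  # minus the curvature, >= 0
        if hess == 0.0:
            break
        beta += grad / hess              # Newton step on concave ll
    return beta
```

For example, six uncensored subjects with alternating covariate values, where the high-covariate subjects tend to fail earlier, yield a positive finite estimate. The MPC version replaces the exponentiations, the division, and (in the multivariate case) the Hessian inversion with secure protocols over secret shares.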